Complex-Valued Autoencoders (18 Mar 2014)

Author

  • Zhiqin Lu
Abstract

Autoencoders are unsupervised machine learning circuits, with typically one hidden layer, whose learning goal is to minimize an average distortion measure between inputs and outputs. Linear autoencoders correspond to the special case where only linear transformations between visible and hidden variables are used. While linear autoencoders can be defined over any field, only real-valued linear autoencoders have been studied so far. Here we study complex-valued linear autoencoders where the components of the training vectors and adjustable matrices are defined over the complex field with the L2 norm. We provide simpler and more general proofs that unify the real-valued and complex-valued cases, showing that in both cases the landscape of the error function is invariant under certain groups of transformations. The landscape has no local minima, a family of global minima associated with Principal Component Analysis, and many families of saddle points associated with orthogonal projections onto subspaces spanned by sub-optimal subsets of eigenvectors of the covariance matrix. The theory yields several iterative, convergent, learning algorithms, a clear understanding of the generalization properties of the trained autoencoders, and can equally be applied to the hetero-associative case when external targets are provided. Partial results on deep architectures as well as on the differential geometry of autoencoders are also presented. The general framework described here is useful to classify autoencoders and identify general properties that ought to be investigated for each class, illuminating some of the connections between autoencoders, unsupervised learning, clustering, Hebbian learning, and information theory.
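The abstract's central result, that the global minima of a complex-valued linear autoencoder correspond to Principal Component Analysis, can be checked numerically. The sketch below (not code from the paper; dimensions and sample count are arbitrary choices) builds the Hermitian covariance matrix of complex data and compares the L2 reconstruction error of the projector onto its top-p eigenvectors against that of a random rank-p projector:

```python
import numpy as np

# Minimal sketch, assuming a linear autoencoder x -> A B x with hidden
# size p and L2 reconstruction error. Per the abstract, the global minima
# correspond to PCA: the orthogonal projection onto the span of the top-p
# eigenvectors of the (Hermitian) covariance matrix.
rng = np.random.default_rng(0)
n, p, m = 8, 3, 500  # visible dim, hidden dim, number of samples (assumed)
X = rng.normal(size=(n, m)) + 1j * rng.normal(size=(n, m))
X = X - X.mean(axis=1, keepdims=True)       # center the data

Sigma = X @ X.conj().T / m                  # Hermitian covariance matrix

eigvals, U = np.linalg.eigh(Sigma)          # eigenvalues in ascending order
Up = U[:, -p:]                              # top-p eigenvectors
P = Up @ Up.conj().T                        # optimal rank-p projector (PCA)
err_pca = np.linalg.norm(X - P @ X) ** 2 / m

# Any other rank-p orthogonal projector, e.g. onto a random p-dimensional
# subspace, reconstructs no better (Eckart-Young, over the complex field).
Q, _ = np.linalg.qr(rng.normal(size=(n, p)) + 1j * rng.normal(size=(n, p)))
err_rand = np.linalg.norm(X - Q @ Q.conj().T @ X) ** 2 / m
print(err_pca <= err_rand)   # True
```

The same comparison with any sub-optimal subset of eigenvectors illustrates the saddle points mentioned in the abstract: those projectors are critical for gradient descent on A and B but are dominated by the top-p choice.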


Related articles

Complex-Valued Autoencoders (Aug 2011)

Autoencoders are unsupervised machine learning circuits whose learning goal is to minimize a distortion measure between inputs and outputs. Linear autoencoders can be defined over any field, and only real-valued linear autoencoders have been studied so far. Here we study complex-valued linear autoencoders where the components of the training vectors and adjustable matrices are defined over the co...

Full text

Complex-valued autoencoders

Autoencoders are unsupervised machine learning circuits, with typically one hidden layer, whose learning goal is to minimize an average distortion measure between inputs and outputs. Linear autoencoders correspond to the special case where only linear transformations between visible and hidden variables are used. While linear autoencoders can be defined over any field, only real-valued linear a...

Full text

Two-valued states on Baer ∗-semigroups (arXiv:1401.1155v1 [quant-ph], 6 Jan 2014)

In this paper we develop an algebraic framework that allows us to extend families of two-valued states on orthomodular lattices to Baer ∗-semigroups. We apply this general approach to study the full class of two-valued states and the subclass of Jauch-Piron two-valued states on Baer ∗-semigroups.

Full text

Clifford Valued Differential Forms, Algebraic Spinor Fields, Gravitation, Electromagnetism and "Unified" Theories ∗ (arXiv:math-ph/0311001v5, 8 Dec 2004)

* This paper is an expanded version of the material contained in [73](math-ph/0407024) and [87](math-ph/0407025), which are published in Int.

Full text

No zero energy states for the supersymmetric x^2 y^2 potential (arXiv:math-ph/0109032v1, 28 Sep 2001)

We show that the positive supersymmetric matrix-valued differential operator H = p_x^2 + p_y^2 + x^2 y^2 + x σ_3 + y σ_1 has no zero modes, i.e., Hψ = 0 implies ψ = 0.

Full text


Journal title:

Volume   Issue 

Pages  -

Publication date: 2014